Kullback-Leibler Information-Based Tests of Fit for Inverse Gaussian Distribution
Authors
Abstract
Similar Resources
Goodness of Fit Test for Gumbel Distribution Based on Kullback-Leibler Information Using Several Different Estimators
In this paper, our objective is to test the statistical hypothesis $H_0: F(x) = F_0(x)$ for all $x$ against $H_1: F(x) \neq F_0(x)$ for some $x$, where $F_0(x)$ is a known distribution function. In this study, a goodness-of-fit test statistic for the Gumbel distribution based on Kullback-Leibler information is studied. The performance of the test under simple random sampling is investigated u...
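To make the idea concrete, here is a minimal sketch of a Kullback-Leibler-information goodness-of-fit statistic of this general type. The specific choices (Vasicek's spacing-based entropy estimator, window size `m=5`, ML fitting via `scipy.stats.gumbel_r.fit`, and bootstrap calibration) are illustrative assumptions, not the paper's exact construction.

```python
# Sketch: KL-information GOF statistic for the Gumbel family (assumed recipe).
import numpy as np
from scipy import stats

def vasicek_entropy(x, m):
    """Vasicek spacing-based sample entropy estimate with window size m."""
    x = np.sort(x)
    n = len(x)
    # clamp indices so x[i-m] and x[i+m] are defined at the boundaries
    lo = np.clip(np.arange(n) - m, 0, n - 1)
    hi = np.clip(np.arange(n) + m, 0, n - 1)
    return np.mean(np.log(n * (x[hi] - x[lo]) / (2 * m)))

def kl_statistic(x, m=5):
    """Estimated KL divergence between the data and the fitted Gumbel law:
    KL = -H_mn - (1/n) * sum log f(x_i; mu_hat, beta_hat).
    Large values indicate departure from the Gumbel family."""
    mu, beta = stats.gumbel_r.fit(x)
    return -vasicek_entropy(x, m) - np.mean(stats.gumbel_r.logpdf(x, mu, beta))

rng = np.random.default_rng(0)
x = stats.gumbel_r.rvs(loc=1.0, scale=2.0, size=100, random_state=rng)
print(kl_statistic(x))  # near 0 under H0; calibrate via parametric bootstrap
```

Because the null distribution of such a statistic depends on the estimators used, critical values are typically obtained by simulation under the fitted null model.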
Gaussian Kullback-Leibler approximate inference
We investigate Gaussian Kullback-Leibler (G-KL) variational approximate inference techniques for Bayesian generalised linear models and various extensions. In particular we make the following novel contributions: sufficient conditions for which the G-KL objective is differentiable and convex are described; constrained parameterisations of Gaussian covariance that make G-KL methods fast and scal...
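As a rough illustration of the G-KL idea, the sketch below fits a Gaussian $q(w) = N(m, s^2)$ to the posterior of a toy one-parameter Bayesian logistic regression by minimizing the negative G-KL bound. The toy data, the $N(0,1)$ prior, and the use of Gauss-Hermite quadrature for the expectation are my assumptions, not the paper's setup.

```python
# Sketch: Gaussian KL (G-KL) variational inference on a 1-D toy model.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
X = rng.normal(size=40)
y = np.where(rng.uniform(size=40) < 1 / (1 + np.exp(-1.5 * X)), 1.0, -1.0)

def log_joint(w):
    """log p(y, w | X) for each candidate weight in array w (N(0,1) prior)."""
    logits = (y * X)[:, None] * w[None, :]
    return -np.logaddexp(0.0, -logits).sum(axis=0) - 0.5 * w**2

nodes, weights = np.polynomial.hermite.hermgauss(30)

def objective(params):
    """Negative G-KL bound for q(w) = N(m, s^2) via Gauss-Hermite quadrature."""
    m, log_s = params
    s = np.exp(log_s)
    e_log_joint = weights @ log_joint(m + np.sqrt(2.0) * s * nodes) / np.sqrt(np.pi)
    entropy = 0.5 * np.log(2.0 * np.pi * np.e) + log_s  # entropy of N(m, s^2)
    return -(e_log_joint + entropy)

res = minimize(objective, x0=np.zeros(2))
print("q(w) = N(%.3f, %.3f^2)" % (res.x[0], np.exp(res.x[1])))
```

For a log-concave likelihood such as the logistic one, this objective is well behaved in the variational parameters, which is the kind of differentiability and convexity condition the abstract refers to.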
Goodness-of-fit Tests for the Inverse Gaussian Distribution Based on the Empirical Laplace Transform
This paper considers two flexible classes of omnibus goodness-of-fit tests for the inverse Gaussian distribution. The test statistics are weighted integrals over the squared modulus of some measure of deviation of the empirical distribution of given data from the family of inverse Gaussian laws, expressed by means of the empirical Laplace transform. Both classes of statistics are connected to t...
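A weighted-integral statistic of this form can be sketched as follows. The exponential weight $e^{-at}$, the plug-in ML estimates, and the use of `scipy.integrate.quad` are assumptions for illustration; the paper's exact statistics and weights may differ.

```python
# Sketch: weighted-L2 Laplace-transform GOF statistic for the inverse Gaussian.
import numpy as np
from scipy import integrate, stats

def ig_laplace(t, mu, lam):
    """Laplace transform of IG(mu, lambda): E[exp(-tX)]."""
    return np.exp((lam / mu) * (1.0 - np.sqrt(1.0 + 2.0 * mu**2 * t / lam)))

def test_statistic(x, a=1.0):
    n = len(x)
    mu = x.mean()                          # ML estimate of mu
    lam = n / np.sum(1.0 / x - 1.0 / mu)   # ML estimate of lambda
    def integrand(t):
        emp = np.mean(np.exp(-t * x))      # empirical Laplace transform L_n(t)
        return (emp - ig_laplace(t, mu, lam))**2 * np.exp(-a * t)
    val, _ = integrate.quad(integrand, 0.0, np.inf)
    return n * val

rng = np.random.default_rng(2)
# invgauss(mu=0.5, scale=4) is IG with mean 2 and shape lambda = 4
x = stats.invgauss.rvs(0.5, scale=4.0, size=100, random_state=rng)
print(test_statistic(x))  # small under H0; calibrate by parametric bootstrap
```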
Alternative Kullback-Leibler information entropy for enantiomers.
In our series of studies on quantifying chirality, a new chirality measure is proposed in this work based on the Kullback-Leibler information entropy. The index computes the extra information that the shape function of one enantiomer carries over a normalized shape function of the racemate, while in our previous studies the shape functions of the R and S enantiomers were used considering one as...
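The underlying computation is just a KL divergence between normalized shape functions. A discretized toy version, with Gaussian densities standing in for real electron-density shape functions, might look like this:

```python
# Sketch: KL "extra information" of one enantiomer vs. the racemate (toy data).
import numpy as np

grid = np.linspace(-6, 6, 2001)
dx = grid[1] - grid[0]

def normalize(rho):
    return rho / (rho.sum() * dx)          # unit-normalized shape function

# toy "R" and "S" enantiomer shape functions (mirror images of each other)
sigma_R = normalize(np.exp(-0.5 * (grid - 1.0)**2))
sigma_S = sigma_R[::-1]
sigma_rac = normalize(0.5 * (sigma_R + sigma_S))   # racemate shape function

def kl(p, q):
    """KL(p || q) on the grid; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx

print(kl(sigma_R, sigma_rac))  # chirality index; 0 when R and S coincide
```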
Bootstrap Estimate of Kullback-Leibler Information for Model Selection
Estimating the Kullback-Leibler information is a crucial part of deriving a statistical model selection procedure based on the likelihood principle, such as AIC. To discriminate between nested models, we have to estimate it up to a constant-order term, while the Kullback-Leibler information itself is of the order of the number of observations. A correction term employed in AIC is an example to...
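One standard way to obtain such a correction term is a bootstrap bias estimate of the log-likelihood, in the spirit of EIC-style criteria. The sketch below uses a normal model and `B=200` replicates purely as illustrative choices; it is an assumed recipe, not the paper's derivation.

```python
# Sketch: bootstrap bias correction of the maximized log-likelihood.
import numpy as np
from scipy import stats

def eic(x, B=200, seed=None):
    """Criterion = -2*loglik(theta_hat; x) + 2*(bootstrap bias estimate)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    def fit_loglik(sample, data):
        mu, sigma = sample.mean(), sample.std()    # ML fit on `sample`
        return stats.norm.logpdf(data, mu, sigma).sum()
    bias = np.mean([
        fit_loglik(xb, xb) - fit_loglik(xb, x)     # optimism of refitted model
        for xb in (rng.choice(x, size=n, replace=True) for _ in range(B))
    ])
    return -2.0 * fit_loglik(x, x) + 2.0 * bias

rng = np.random.default_rng(3)
x = stats.norm.rvs(size=100, random_state=rng)
print(eic(x))  # comparable across candidate models; smaller is better
```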
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
Journal
Journal title: Korean Journal of Applied Statistics
Year: 2011
ISSN: 1225-066X
DOI: 10.5351/kjas.2011.24.6.1271